Tracking continuous emotional trends of participants during affective dyadic interactions using body language and speech information

Authors

  • Angeliki Metallinou
  • Athanasios Katsamanis
  • Shrikanth S. Narayanan
Abstract

We address the problem of tracking continuous levels of a participant’s activation, valence and dominance during the course of affective dyadic interactions, where participants may be speaking, listening or doing neither. To this end, we extract detailed and intuitive descriptions of each participant’s body movements, posture and behavior towards his interlocutor, and speech information. We apply a Gaussian Mixture Model-based approach which computes a mapping from a set of observed audio-visual cues to an underlying emotional state. We obtain promising results for tracking trends of participants’ activation and dominance values, which outperform other regression-based approaches used in the literature. Additionally, we shed light into the way expressive body language is modulated by underlying emotional states in the context of dyadic interactions.
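The abstract describes a Gaussian Mixture Model-based mapping from observed audio-visual cues to a continuous emotional state. One standard way to realize such a mapping (a sketch, not the authors' exact implementation) is GMM regression: fit a joint GMM over stacked feature and emotion vectors, then predict the emotion for a new feature vector as the minimum-mean-square-error estimate E[y|x], i.e. a responsibility-weighted sum of per-component conditional means. The feature dimensionality, component count, and synthetic data below are illustrative assumptions.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def fit_joint_gmm(X, y, n_components=4, seed=0):
    """Fit a full-covariance GMM on the joint space [features, emotion]."""
    Z = np.column_stack([X, y])
    gmm = GaussianMixture(n_components=n_components,
                          covariance_type="full", random_state=seed)
    gmm.fit(Z)
    return gmm

def gmm_regression(gmm, X, dx):
    """MMSE estimate E[y|x] under a joint GMM; dx = feature dimension."""
    K = gmm.n_components
    mu_x = gmm.means_[:, :dx]          # component means, feature part
    mu_y = gmm.means_[:, dx:]          # component means, emotion part
    S = gmm.covariances_
    inv_Sxx = [np.linalg.inv(S[k][:dx, :dx]) for k in range(K)]
    # Per-component regression matrix: Sigma_yx @ Sigma_xx^{-1}
    A = [S[k][dx:, :dx] @ inv_Sxx[k] for k in range(K)]
    preds = np.zeros(len(X))
    for i, x in enumerate(X):
        # Responsibilities from the marginal p(x | component k)
        log_w = np.log(gmm.weights_).copy()
        for k in range(K):
            d = x - mu_x[k]
            _, logdet = np.linalg.slogdet(S[k][:dx, :dx])
            log_w[k] += -0.5 * (d @ inv_Sxx[k] @ d + logdet
                                + dx * np.log(2 * np.pi))
        r = np.exp(log_w - log_w.max())
        r /= r.sum()
        # Weighted sum of conditional means mu_y + A (x - mu_x)
        preds[i] = sum(r[k] * (mu_y[k] + A[k] @ (x - mu_x[k]))[0]
                       for k in range(K))
    return preds

# Synthetic stand-in for audio-visual features and a continuous rating
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 3))
y = np.tanh(X[:, 0]) + 0.1 * rng.normal(size=300)
gmm = fit_joint_gmm(X, y)
yhat = gmm_regression(gmm, X, dx=3)
```

In practice the emotion target would be the time-continuous activation, valence, or dominance annotation, and the features would be the body-language and speech descriptors extracted per frame.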


Similar Articles

Cross-Subject Continuous Emotion Recognition Using Speech and Body Motion in Dyadic Interactions

Dyadic interactions encapsulate rich emotional exchange between interlocutors suggesting a multimodal, cross-speaker and cross-dimensional continuous emotion dependency. This study explores the dynamic inter-attribute emotional dependency at the cross-subject level with implications to continuous emotion recognition based on speech and body motion cues. We propose a novel two-stage Gaussian Mix...


Use of Agreement/Disagreement Classification in Dyadic Interactions for Continuous Emotion Recognition

Natural and affective handshakes of two participants define the course of dyadic interaction. Affective states of the participants are expected to be correlated with the nature or type of the dyadic interaction. In this study, we investigate relationship between affective attributes and nature of dyadic interaction. In this investigation we use the JESTKOD database, which consists of speech and...


The USC CreativeIT database of multimodal dyadic interactions: from speech and full body motion capture to continuous emotional annotations

Improvised acting is a viable technique to study expressive human communication and to shed light into actors’ creativity. The USC CreativeIT database provides a novel, freely-available multimodal resource for the study of theatrical improvisation and rich expressive human behavior (speech and body language) in dyadic interactions. The theoretical design of the database is based on the well-est...


Continuous emotion tracking using total variability space

Automatic continuous emotion tracking (CET) has received increased attention with expected applications in medical, robotic, and human-machine interaction areas. The speech signal carries useful clues to estimate the affective state of the speaker. In this paper, we present Total Variability Space (TVS) for CET from speech data. TVS is a widely used framework in speaker and language recognition...


Analysis and Predictive Modeling of Body Language Behavior in Dyadic Interactions From Multimodal Interlocutor Cues

During dyadic interactions, participants adjust their behavior and give feedback continuously in response to the behavior of their interlocutors and the interaction context. In this paper, we study how a participant in a dyadic interaction adapts his/her body language to the behavior of the interlocutor, given the interaction goals and context. We apply a variety of psychology-inspired body lan...



Journal:
  • Image Vision Comput.

Volume 31, Issue -

Pages -

Publication date: 2013